
    Arousing elements in children’s digital interactive storybook

    This paper reports an ongoing study on making children’s digital storybooks arousing. The problem addressed is the lack of proper elements to guide designers in building children’s digital storybooks. Without such elements, designers tend to design digital storybooks based on their own preferences rather than on what is usable for children, which may leave children in a discouraged state when they interact with the storybook. To address this, the paper aims to determine the most common interface components of digital interactive storybooks that make children feel aroused while interacting with them. To accomplish this, a series of field studies was carried out involving a sample of interactive digital storybooks. Data were collected from 13 children aged seven to nine through observation and interviews. In the end, a set of the most common elements that make children aroused when interacting with interactive digital storybooks was gathered.

    Understanding Link Fabrication Attack in Software Defined Network using Formal Methods

    The complex nature of SDN, coupled with the huge number of services it provides, makes the system as a whole prone to malicious attacks. An attack such as the Link Fabrication Attack (LFA) is becoming more common and increasingly sophisticated. Different techniques exist for detecting the occurrence of LFA in SDN. To ensure that an SDN does not suffer the same attack from the same origin, we have to consider the entire system's liability, service information, dependencies, and switch as well as host vulnerabilities. However, due to the complex nature and scalability of SDN, determining the origin of an LFA is not straightforward. Although solutions have been proposed to address this problem in traditional networks, a formal approach to security in SDN is lacking. In this paper, we discuss a formal method for SDN using Higher-Order Logic (HOL) and, as a case study, use it to examine LFA in SDN.
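    As a hedged illustration of the kind of property such a HOL formalization can capture (the predicate names below are ours, not the paper's), a link advertised through topology discovery can be modelled as genuine only when an LLDP packet demonstrably traversed two switch-only ports:

        \forall s_1\, p_1\, s_2\, p_2.\;
          \mathit{genuine}(s_1, p_1, s_2, p_2) \iff
            \mathit{lldp\_sent}(s_1, p_1) \land \mathit{lldp\_recv}(s_2, p_2)
            \land \lnot \mathit{host\_port}(s_1, p_1) \land \lnot \mathit{host\_port}(s_2, p_2)

    A link fabrication attack then corresponds to a link accepted by the controller for which the right-hand side is unprovable, which is exactly the kind of discrepancy a theorem prover can be asked to expose.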

    A Comprehensive Review on Adaptability of Network Forensics Frameworks for Mobile Cloud Computing

    Network forensics enables the investigation and identification of network attacks through retrieved digital content. The proliferation of smartphones and cost-effective universal data access through the cloud have made Mobile Cloud Computing (MCC) a natural target for network attacks. However, the constraints on carrying out forensics in MCC stem from the autonomous cloud hosting companies and their policies restricting access to the digital content on back-end cloud platforms. This implies that existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to MCC. Explicitly, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison is given that explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC.

    Systematic Review on Security and Privacy Requirements in Edge Computing: State of the Art and Future Research Opportunities

    Edge computing is a promising paradigm that enhances the capabilities of cloud computing. For its computing services to remain trusted, the environment must be kept free from security and privacy breaches. The security and privacy issues associated with the edge computing environment have narrowed the overall acceptance of the technology as a reliable paradigm. Many researchers have reviewed security and privacy issues in edge computing, but not all have fully investigated the security and privacy requirements. Security and privacy requirements are the objectives that indicate the capabilities and functions a system must perform to eliminate certain security and privacy vulnerabilities. This paper substantially reviews the security and privacy requirements of edge computing and the technological methods employed by the techniques used to curb the threats, with the aim of helping future researchers identify research opportunities. It investigates the current studies and highlights the following: (1) the classification of security and privacy requirements in edge computing, (2) the state-of-the-art techniques deployed to curb security and privacy threats, (3) the trends in technological methods employed by these techniques, (4) the metrics used to evaluate their performance, (5) a taxonomy of attacks affecting the edge network and the corresponding technological trends for mitigating them, and (6) research opportunities for future researchers in the area of edge computing security and privacy.

    Local Descriptor for Retinal Fundus Image Registration

    A feature-based retinal image registration (RIR) technique aligns multiple fundus images and is composed of pre-processing, feature point extraction, feature description, matching, and geometrical transformation. Challenges in RIR include differences in scaling, intensity, and rotation between images. The scale and intensity differences can be minimised with a consistent imaging setup and with image enhancement during pre-processing, respectively. Rotation can be addressed with a feature descriptor that is robust to varying rotation. Therefore, a feature descriptor based on statistical properties (FiSP) is proposed to describe the circular region surrounding a feature point. In experiments on the public Fundus Image Registration dataset, FiSP achieved an average of 99.227% correct matches for rotations between 0° and 180°. FiSP was then paired with the Harris corner, scale-invariant feature transform (SIFT), speeded-up robust feature (SURF), Ghassabi's and D-Saddle feature point extraction methods to assess its registration performance against existing feature-based RIR techniques, namely generalised dual-bootstrap iterative closest point (GDB-ICP), Harris-partial intensity invariant feature descriptor (PIIFD), Ghassabi's-SIFT, H-M 16, H-M 17 and D-Saddle-histogram of oriented gradients (HOG). The combination SIFT-FiSP registered 64.179% of the image pairs and significantly outperformed the other techniques, with mean differences between 25.373% and 60.448% (p < 0.001).
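    The abstract does not enumerate the exact statistics FiSP computes, so the Python sketch below is only illustrative of a statistics-based descriptor over a circular region; the moments chosen (mean, variance, skewness, kurtosis) are assumptions. Because they are computed over an unordered set of pixels, the resulting vector is unchanged when the patch rotates, which is the robustness property claimed above.

        # Illustrative statistics-based circular-region descriptor (not the
        # actual FiSP feature set, which the abstract does not enumerate).
        import numpy as np
        from scipy.stats import skew, kurtosis

        def circular_patch(image, x, y, radius):
            """Intensities inside a circle centred on a feature point."""
            h, w = image.shape
            ys, xs = np.ogrid[:h, :w]
            mask = (xs - x) ** 2 + (ys - y) ** 2 <= radius ** 2
            return image[mask]

        def statistical_descriptor(image, x, y, radius=16):
            # Order-independent statistics are rotation invariant.
            patch = circular_patch(image, x, y, radius).astype(np.float64)
            return np.array([patch.mean(), patch.var(),
                             skew(patch), kurtosis(patch)])

    Matching between two images can then proceed by nearest-neighbour search over these descriptor vectors.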

    Sequential Monte Carlo Localization Methods in Mobile Wireless Sensor Networks: A Review

    The advancement of digital technology has increased the deployment of wireless sensor networks (WSNs) in our daily life. However, locating sensor nodes is a challenging task in WSNs. Sensed data without an accurate location are worthless, especially in critical applications. The pioneering technique among range-free localization schemes is the sequential Monte Carlo (SMC) method, which utilizes network connectivity to estimate sensor location without additional hardware. This study presents a comprehensive survey of state-of-the-art SMC localization schemes. We present the schemes as a thematic taxonomy of localization operations in SMC. Moreover, the critical characteristics of each existing scheme are analyzed to identify its advantages and disadvantages. The similarities and differences of each scheme are investigated on the basis of significant parameters, namely localization accuracy, computational cost, communication cost, and number of samples. We discuss the challenges and directions of future research for each parameter.
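    For readers unfamiliar with the basic scheme, the following Python sketch shows one round of range-free SMC localization in the spirit of the original MCL algorithm (prediction bounded by maximum speed, then filtering by anchor connectivity); the individual schemes surveyed in the paper refine these two steps in different ways.

        import math, random

        def mcl_step(samples, anchors_heard, r, v_max, n=50, max_tries=10000):
            """One SMC localization round.
            samples       -- (x, y) hypotheses surviving the previous round
            anchors_heard -- positions of anchors within one radio hop now
            r, v_max      -- radio range and maximum node speed per round
            """
            new_samples, tries = [], 0
            while len(new_samples) < n and tries < max_tries:
                tries += 1
                x, y = random.choice(samples)
                # Prediction: the node moved at most v_max since last round.
                d, a = random.uniform(0, v_max), random.uniform(0, 2 * math.pi)
                cx, cy = x + d * math.cos(a), y + d * math.sin(a)
                # Filtering: keep hypotheses consistent with connectivity.
                if all(math.hypot(cx - ax, cy - ay) <= r
                       for ax, ay in anchors_heard):
                    new_samples.append((cx, cy))
            k = max(len(new_samples), 1)
            estimate = (sum(p[0] for p in new_samples) / k,
                        sum(p[1] for p in new_samples) / k)
            return new_samples, estimate

    The localization accuracy, computational cost, and number of samples compared in the survey map directly onto the sample count n and the rejection loop above.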

    Efficient and Stable Routing Algorithm Based on User Mobility and Node Density in Urban Vehicular Network

    Vehicular ad hoc networks (VANETs) are considered an emerging technology in the industrial and educational fields. This technology is essential to the deployment of the intelligent transportation system, which is targeted at improving the safety and efficiency of traffic. The implementation of VANETs can be effectively executed by transmitting data among vehicles over multiple hops. However, the intrinsic characteristics of VANETs, such as their dynamic network topology and intermittent connectivity, limit data delivery. One particular challenge of this network is that a contributing node may remain in the network for only a limited time; hence, to prevent data loss from that node, the information must reach the destination node via multi-hop routing techniques. An appropriate, efficient, and stable routing algorithm must be developed for various VANET applications to address the issues of dynamic topology and intermittent connectivity. Therefore, this paper proposes a novel routing algorithm called the efficient and stable routing algorithm based on user mobility and node density (ESRA-MD). The proposed algorithm can adapt to the significant changes that may occur in an urban vehicular environment. It works by selecting an optimal route on the basis of hop count and link duration for delivering data from source to destination, thereby satisfying various quality-of-service considerations. The validity of the proposed algorithm is investigated by comparison with the ARP-QD protocol, which also performs optimal route finding in urban VANET environments. Simulation results reveal that the proposed ESRA-MD algorithm shows remarkable improvement in terms of delivery ratio, delivery delay, and communication overhead.
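    The abstract describes route selection on the basis of hop count and link duration; the Python sketch below illustrates one plausible way to combine the two into a single score (the weighting and normalization are our assumptions, not the published ESRA-MD metric).

        # Hypothetical route score trading off efficiency (few hops)
        # against stability (long-lived weakest link).
        def route_score(hop_count, link_durations, alpha=0.5):
            stability = min(link_durations)   # route dies with its weakest link
            efficiency = 1.0 / hop_count      # fewer hops, less delay/overhead
            return alpha * efficiency + (1 - alpha) * stability / (stability + 1.0)

        # Choose among candidate routes, each (hop_count, per-link durations):
        routes = [(3, [12.0, 8.5, 20.1]), (5, [30.0, 25.2, 28.9, 31.0, 27.5])]
        best = max(routes, key=lambda rt: route_score(*rt))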

    WDARS: A Weighted Data Aggregation Routing Strategy with Minimum Link Cost in Event-Driven WSNs

    Realizing the full potential of wireless sensor networks (WSNs) raises many design issues, particularly trade-offs among conflicting objectives such as maximizing route overlap for efficient data aggregation and minimizing total link cost. While data aggregation routing protocols and link cost functions in WSNs have each been comprehensively considered in the literature, a trade-off between the two has not yet been addressed. In this paper, a comprehensive weight for trading off these objectives is employed in the proposed weighted data aggregation routing strategy (WDARS), which aims to maximize route overlap for efficient data aggregation while simultaneously addressing link cost in cluster-based WSNs. The proposed methodology is evaluated for energy consumption, network lifetime, throughput, and packet delivery ratio, and compared with InFRA and DRINA, cluster-based routing protocols that aim only to maximize route overlap for efficient data aggregation. Analysis and simulation results reveal that WDARS delivers a longer network lifetime with more proficient and reliable performance than the other methods.
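    As a sketch of how such a trade-off weight can steer routing (the formula below is ours; the paper defines its own), each link can be given a composite weight that discounts links already lying on established aggregation routes, so that shortest-path route construction bends new routes toward them.

        # Hypothetical composite link weight: lower is better.
        def composite_weight(link_cost, on_established_route, w=0.7):
            # Links shared with existing routes are discounted to raise
            # route overlap and hence in-network data aggregation.
            overlap_penalty = 0.0 if on_established_route else 1.0
            return w * overlap_penalty + (1.0 - w) * link_cost

    Running a standard shortest-path algorithm over these weights then pursues both objectives at once: overlap is rewarded while raw link cost is still charged.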

    Development and Analysis of Advanced Image Steganalysis Techniques.

    Steganography is the art of providing a secret communication channel for the transmission of covert information. At the same time, it can be used by cyber criminals to conceal their work. This potential illegal use of steganography is the basis for the objectives of this thesis. The thesis initially reviews possible flaws in current implementations of steganalysis. Using images from different camera types, it confirms the expectation that steganalysis performance is significantly affected by differences in image sources. We also show that image compression in a steganalysis process has an impact on steganalysis performance, as claimed in the literature. A review of currently available steganalysis techniques, along with a proposal to overcome the said problems, is also presented. We propose a new steganalysis technique based on conditional probability statistics, which works on 72 features (conditional probability features) extracted from each image for the purpose of classification. In experiments on standard benchmarks, this new approach achieved comparable classification accuracies. Furthermore, the new features performed well when applied to image forensic tasks: applied to images from four digital cameras, they classified test images according to their sources with accuracy rates of 99.5% and 91.5% in the inter-camera and intra-camera model cases, respectively.
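    The abstract does not detail how the 72 conditional probability features are constructed, so the Python sketch below only illustrates the general idea: estimating the conditional distribution of one truncated pixel-difference given its neighbour and using the flattened conditional matrix as a feature vector (the neighbourhood and truncation threshold are assumptions).

        import numpy as np

        def conditional_probability_features(image, t=2):
            """P(d2 = j | d1 = i) over truncated horizontal differences."""
            img = image.astype(np.int32)
            d1 = np.clip(img[:, 1:-1] - img[:, :-2], -t, t).ravel()
            d2 = np.clip(img[:, 2:] - img[:, 1:-1], -t, t).ravel()
            k = 2 * t + 1
            joint = np.zeros((k, k))
            np.add.at(joint, (d1 + t, d2 + t), 1)     # co-occurrence counts
            rows = np.maximum(joint.sum(axis=1, keepdims=True), 1)
            return (joint / rows).ravel()             # (2t+1)^2 features

    A classifier (for example an SVM) trained on such vectors can then separate cover images from stego images, or distinguish images from different camera models.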